Make diffusion model conditioning more flexible #521
Codecov Report: All modified and coverable lines are covered by tests ✅
Thanks for the PR, I think this is a reasonable idea for advanced use cases. As this is another instance of multi-input networks (even though it's inside the inference network this time, and does not involve the adapter), we might want to include this in the discussion in #517. We might also think about:
Tagging @LarsKue for comment as well.
One very general approach would be to break free from the fixed names, such as "inference_variables", and actually allow for arbitrary, user-defined names.
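As a purely hypothetical sketch of that idea (none of the names or the routing dictionary below are actual BayesFlow API), it could look like letting users declare their own variable names and then map them onto the roles the network expects:

```python
import numpy as np

# Hypothetical: users pick their own names instead of fixed ones such as
# "inference_variables" / "inference_conditions".
data = {
    "parameters": np.random.normal(size=(128, 5)),
    "observations": np.random.normal(size=(128, 10)),
    "design": np.random.normal(size=(128, 2)),
}

# A separate mapping would then declare which role each tensor plays,
# so the fixed names are no longer baked into the model.
roles = {
    "targets": ["parameters"],
    "conditions": ["observations", "design"],
}
```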
I agree @vpratz. I added the same logic to both models. I am open to suggestions on how the tensors should be passed to the subnet. Regarding @stefanradev93's comment: I think this flexibility is only needed by advanced users, so maybe we should not follow this general approach for now, as the fixed names help users get started with BayesFlow.
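To illustrate the two options under discussion, here is a minimal sketch in plain Keras; the argument names `xz`, `time`, and `conditions` are only illustrative, not the PR's actual signature:

```python
import keras

def call_concatenated(subnet, xz, time, conditions):
    # Current default: flatten everything into one vector for an MLP subnet.
    return subnet(keras.ops.concatenate([xz, time, conditions], axis=-1))

def call_separately(subnet, xz, time, conditions):
    # Alternative: hand each tensor to the subnet as-is, so e.g. a
    # convolutional subnet can keep the spatial layout of xz.
    return subnet(xz=xz, time=time, conditions=conditions)
```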
Absolutely, this is definitely a >2.1.x idea.
For now, is it okay as it is, @stefanradev93?
I think we need to document somewhere how this can be used (i.e., which inputs are passed to the network depending on the `concatenated_input` setting). It would be good to have a test in place for the new setting as well.
Thanks @vpratz for the suggestions. I added the documentation. Regarding the test: as we do not have a network at the moment which can handle multiple inputs, I do not know a useful test for the new setting.
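As a rough idea of what such a test network could look like (only a sketch, not the test that was eventually added), a tiny Keras layer that accepts the inputs separately would already exercise the new code path:

```python
import keras

class MultiInputSubnet(keras.layers.Layer):
    """Minimal subnet taking separate inputs; just enough to test the
    non-concatenated code path (argument names are illustrative)."""

    def __init__(self, units=32, **kwargs):
        super().__init__(**kwargs)
        self.hidden = keras.layers.Dense(units, activation="relu")

    def call(self, xz, time, conditions=None):
        tensors = [xz, time] if conditions is None else [xz, time, conditions]
        return self.hidden(keras.ops.concatenate(tensors, axis=-1))
```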
@arrjon Thanks for the changes!
@vpratz I added the test. Now the merge should be ready! :)
Thanks a lot @arrjon. Sorry for being slow to review, and for always coming up with new things I didn't notice before.
I corrected the shape.
@vpratz now the tests pass as well :)
Great, I'll take a final look tomorrow and then merge it. |
The convention is to use the parameter name with a `_shape` suffix.
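For example (class, method, and argument names chosen only for illustration of the naming convention):

```python
class ExampleSubnet:
    # The shape arguments of `build` mirror the tensor arguments of `call`,
    # each carrying a `_shape` suffix.
    def build(self, xz_shape, conditions_shape=None):
        ...

    def call(self, xz, conditions=None):
        ...
```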
The cost of the continuous models on the CI is too high.
I have changed the naming for the shapes and created a reduced test for this setting, as the inference network tests are really slow.
Thanks @vpratz!
I introduced a new keyword `concatenated_input` to the `subnet_kwargs` in the diffusion model. This keyword controls how inputs, such as parameters, noise, and condition, are fed into the model's subnet. Previously, the model assumed all inputs were 1D vectors and concatenated them directly for the default MLP subnet. However, for more flexible architectures, such as subnets designed to preserve or induce spatial structures, this assumption doesn't hold. Now all inputs can also be passed separately to the subnet.
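A hedged usage sketch: the subnet below is plain Keras and purely illustrative, and the wiring in the final comment uses the names from this PR (`subnet_kwargs`, `concatenated_input`), which may differ in the released API:

```python
import keras

class SpatialSubnet(keras.layers.Layer):
    """Illustrative subnet that receives its inputs separately: the noisy
    variables keep a (batch, length, channels) layout and pass through a
    convolution before the flat time/condition vectors are mixed in."""

    def __init__(self, filters=16, **kwargs):
        super().__init__(**kwargs)
        self.conv = keras.layers.Conv1D(filters, kernel_size=3, padding="same")
        self.pool = keras.layers.GlobalAveragePooling1D()
        self.dense = keras.layers.Dense(filters, activation="relu")

    def call(self, xz, time, conditions=None):
        # Use the spatial structure of xz before anything is flattened.
        summary = self.pool(self.conv(xz))
        flat = [summary, time] if conditions is None else [summary, time, conditions]
        # A real diffusion subnet would still need an output head whose shape
        # matches xz; that part is omitted here.
        return self.dense(keras.ops.concatenate(flat, axis=-1))

# Hypothetical wiring, using the names from this discussion:
# DiffusionModel(subnet=SpatialSubnet, subnet_kwargs={"concatenated_input": False})
```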